
8 Unobserved Confounding: Bounds and Sensitivity Analysis
$$T := \alpha_w W + \alpha_u U \qquad (8.61 \text{ revisited})$$
[62]: Rosenbaum and Rubin (1983), 'Assessing Sensitivity to an Unobserved Binary Covariate in an Observational Study with Binary Outcome'
[63]: Imbens (2003), 'Sensitivity to Exogeneity Assumptions in Program Evaluation'
16: Imbens [63] is the first to introduce contour plots like the ones in our Figure 8.5.
[64]: Cinelli and Hazlett (2020), 'Making sense of sensitivity: extending omitted variable bias'
[65]: Veitch and Zaveri (2020), Sense and Sensitivity Analysis: Simple Post-Hoc Analysis of Bias Due to Unobserved Confounding
[66]: Liu et al. (2013), 'An introduction to sensitivity analysis for unobserved confounding in nonexperimental prevention research'
[67]: Rosenbaum (2002), Observational Studies
[68]: Rosenbaum (2010), Design of Observational Studies
[69]: Rosenbaum (2017), Observation and Experiment
[70]: Franks et al. (2019), 'Flexible Sensitivity Analysis for Observational Studies Without Observable Implications'
[71]: Yadlowsky et al. (2020), Bounds on the conditional and average treatment effect with unobserved confounding factors
[72]: VanderWeele and Arah (2011), 'Bias formulas for sensitivity analysis of unmeasured confounding for general outcomes, treatments, and confounders'
[73]: Ding and VanderWeele (2016), 'Sensitivity Analysis Without Assumptions'
In the example we depict in Figure 8.5, the green curve (third from the bottom/left) indicates how strong the confounding would need to be in order to completely explain the observed association. In other words, $\left(\frac{1}{\alpha_u}, \beta_u\right)$ would need to be large enough to fall on the green curve or above in order for the true ATE $\delta$ to be zero or the opposite sign of $\mathbb{E}_W[\mathbb{E}[Y \mid T = 1, W] - \mathbb{E}[Y \mid T = 0, W]] = 25$.
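To make this contour-plot logic concrete, here is a minimal sketch. It assumes the confounding bias in the linear setting takes the product form $\beta_u \cdot \frac{1}{\alpha_u}$ (consistent with Figure 8.5's axes), so the true ATE implied by a pair of sensitivity parameters is the observed association minus that product; the specific numbers are only for illustration.

```python
CONFOUNDED_EST = 25.0  # the observed E_W[E[Y|T=1,W] - E[Y|T=0,W]] from the text


def true_ate(alpha_u, beta_u, confounded_est=CONFOUNDED_EST):
    """True ATE delta implied by sensitivity parameters (alpha_u, beta_u),
    assuming the linear-setting bias takes the form beta_u / alpha_u."""
    return confounded_est - beta_u / alpha_u


# Points on the "explains everything" curve satisfy beta_u / alpha_u = 25,
# i.e. the true ATE is exactly zero:
print(true_ate(alpha_u=1.0, beta_u=25.0))  # 0.0
# Weaker confounding leaves part of the association as a real effect:
print(true_ate(alpha_u=2.0, beta_u=25.0))  # 12.5
```

Sweeping `true_ate` over a grid of $(\frac{1}{\alpha_u}, \beta_u)$ values and drawing level sets of the result reproduces contour plots in the style of Figure 8.5.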
8.2.2 More General Settings
We considered a simple linear setting in Section 8.2.1 in order to easily convey the important concepts in sensitivity analysis. However, there is existing work that allows us to do sensitivity analysis in more general settings.
Say we are in the common setting where $T$ is binary. This is not the case in the previous section (see Equation 8.61). Rosenbaum and Rubin [62] and Imbens [63]¹⁶ consider a simple binary treatment setting with binary $U$ by just putting a logistic sigmoid function around the right-hand side of Equation 8.61 and using that for the probability of treatment instead of the actual value of treatment:

$$P(T = 1 \mid W, U) := \frac{1}{1 + \exp(-(\alpha_w W + \alpha_u U))} \qquad (8.78)$$
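Equation 8.78 is straightforward to simulate. The sketch below wraps the linear form in a sigmoid to get treatment probabilities and then samples a binary $T$; the sensitivity-parameter values are arbitrary choices for illustration.

```python
import numpy as np


def p_treatment(W, U, alpha_w, alpha_u):
    """P(T = 1 | W, U) in the style of Equation 8.78: a logistic sigmoid
    wrapped around the linear form alpha_w * W + alpha_u * U."""
    return 1.0 / (1.0 + np.exp(-(alpha_w * W + alpha_u * U)))


rng = np.random.default_rng(0)
W = rng.normal(size=5)
U = rng.integers(0, 2, size=5)  # binary unobserved confounder, as in [62, 63]
p = p_treatment(W, U, alpha_w=0.5, alpha_u=1.0)  # illustrative parameter values
T = rng.binomial(1, p)  # binary treatment sampled from the model
```

Note that when $\alpha_w W + \alpha_u U = 0$, the model gives $P(T = 1 \mid W, U) = \frac{1}{2}$, as a sigmoid should.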
No Assumptions on $T$ or $U$
Fortunately, we can drop a lot of the assumptions that we've seen so far. Unlike the linear form that we assumed for $T$ in Section 8.2.1 and the linearish form that Rosenbaum and Rubin [62] and Imbens [63] assume, Cinelli and Hazlett [64] develop a method for sensitivity analysis that is agnostic to the functional form of $T$. Their method also allows for $U$ to be non-binary and for $U$ to be a vector, rather than just a single unobserved confounder.
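To give a flavor of this line of work, one headline quantity from Cinelli and Hazlett [64] is the "robustness value": roughly, the partial $R^2$ that confounders would need with both treatment and outcome to change the estimate by a given fraction. The sketch below implements the formula as I understand it (via the partial Cohen's $f$ of the treatment coefficient); treat it as an illustrative approximation, not a substitute for their paper or software.

```python
import math


def robustness_value(t_stat, dof, q=1.0):
    """Robustness value in the spirit of Cinelli & Hazlett (2020): the
    partial R^2 that confounders would need with BOTH the treatment and
    the outcome to reduce the estimate by a fraction q (q=1: to zero).
    Built from the partial Cohen's f of the treatment, f = |t| / sqrt(dof)."""
    fq = q * abs(t_stat) / math.sqrt(dof)
    return 0.5 * (math.sqrt(fq**4 + 4 * fq**2) - fq**2)


rv = robustness_value(t_stat=4.0, dof=100)  # made-up regression output
# rv is a partial R^2, so it lives in (0, 1); larger |t| means a larger
# robustness value, i.e. a harder-to-explain-away estimate.
```

A defining property worth checking: at the robustness value, $\mathrm{RV}^2 / (1 - \mathrm{RV}) = f_q^2$, which is how the closed form above is derived.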
Arbitrary Machine Learning Models for Parametrization of $T$ and $Y$
Recall that all of the estimators that we considered in Chapter 7 allowed us to plug in arbitrary machine learning models to get model-assisted estimators. It might be attractive to have an analogous option in sensitivity analysis, potentially using the exact same models for the conditional outcome model $\mu$ and the propensity score $e$ that we used for estimation. And this is exactly what Veitch and Zaveri [65] give us. They are even able to derive a closed-form expression for the confounding bias, assuming the models we use for $\mu$ and $e$ are well-specified, something that Rosenbaum and Rubin [62] and Imbens [63] didn't do in their simple setting.
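The quantity all of these methods try to get a handle on can be illustrated schematically. The sketch below (which is not Veitch and Zaveri's actual bias expression) simulates linear data with an unobserved confounder $U$, then compares an outcome model fit on $(T, W)$ with an oracle fit on $(T, W, U)$; OLS stands in for arbitrary ML models for $\mu$, and all coefficients are invented. The gap between the two treatment coefficients is the confounding bias that sensitivity analysis tries to bound.

```python
import numpy as np

# Simulated data-generating process (all coefficients made up for illustration).
rng = np.random.default_rng(0)
n = 50_000
W = rng.normal(size=n)                      # observed confounder
U = rng.normal(size=n)                      # UNOBSERVED confounder
T = 1.5 * W + 2.0 * U + rng.normal(size=n)  # treatment depends on both
delta = 1.0                                 # true ATE
Y = delta * T + 0.8 * W + 3.0 * U + rng.normal(size=n)


def ols_t_coef(*covariates):
    """Coefficient on T from an OLS regression of Y on T, covariates, intercept."""
    X = np.column_stack([T, *covariates, np.ones(n)])
    coefs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coefs[0]


confounded = ols_t_coef(W)    # adjusts for W only, as we must in practice
oracle = ols_t_coef(W, U)     # also adjusts for the unobserved U
# The gap (confounded - oracle) is the confounding bias; for this DGP it
# works out to roughly 3.0 * 2 / 5 = 1.2 analytically.
```

The oracle fit recovers $\delta$ because adjusting for $(W, U)$ satisfies unconfoundedness in this simulation, while the feasible fit is biased upward; sensitivity analysis asks how large that gap could plausibly be without ever seeing $U$.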
Holy Shit; There Are a Lot of Options
Although we only highlighted a few options above, there are many different approaches to sensitivity analysis, and people don't agree on which ones are best. This means that sensitivity analysis is an active area of current research. See Liu et al. [66] for a review of methods that preceded 2013. Rosenbaum is another key figure in sensitivity analysis, with several different approaches of his own [67–69]. Here is a non-exhaustive list of a few other flexible sensitivity analysis methods that you might be interested in looking into: Franks et al. [70], Yadlowsky et al. [71], VanderWeele and Arah [72], and Ding and VanderWeele [73].